Maximum Probability and Relative Entropy Maximization. Bayesian Maximum Probability and Empirical Likelihood

Author

  • M. Grendár
Abstract

The works briefly surveyed here are concerned with two basic methods, Maximum Probability and Bayesian Maximum Probability, as well as with their asymptotic instances, Relative Entropy Maximization and Maximum Non-parametric Likelihood. Parametric and empirical extensions of the latter methods, Empirical Maximum Maximum Entropy and Empirical Likelihood, are also mentioned. The methods are viewed as tools for solving certain ill-posed inverse problems, called the Π-problem and the Φ-problem, respectively. Within the two classes of problems, the probabilistic justification and interpretation of the respective methods are discussed.
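Since the abstract presents Relative Entropy Maximization (REM/MaxEnt) as a tool for ill-posed inverse problems, a minimal numeric sketch may help fix ideas. The example below is illustrative and not from the paper: it recovers a distribution on a die's faces from a single moment constraint, using the exponential-family form of the REM solution and a bisection search for the multiplier.

```python
import numpy as np

# Illustrative ill-posed inverse problem: many distributions p on {1,...,6}
# satisfy E_p[X] = 4.5; REM selects the one closest (in relative entropy)
# to the uniform prior q.
x = np.arange(1, 7)
q = np.full(6, 1 / 6)
target = 4.5

def tilted_mean(lam):
    # The REM solution has the exponential form p_i ∝ q_i * exp(lam * x_i);
    # the multiplier lam is chosen to meet the moment constraint.
    w = q * np.exp(lam * x)
    p = w / w.sum()
    return p, p @ x

# Bisection on lam: the tilted mean is increasing in lam.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    _, m = tilted_mean(mid)
    lo, hi = (mid, hi) if m < target else (lo, mid)

p_rem, mean = tilted_mean((lo + hi) / 2)
```

The resulting distribution tilts probability toward the larger faces, as one expects for a constrained mean above 3.5.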


Similar resources

Maximum Probability and Maximum Entropy methods: Bayesian interpretation

(Jaynes’) Method of (Shannon-Kullback’s) Relative Entropy Maximization (REM or MaxEnt) can, at least in the discrete case, be viewed by the Maximum Probability Theorem (MPT) as an asymptotic instance of the Maximum Probability method (MaxProb). A simple Bayesian interpretation of MaxProb is given here; MPT carries this interpretation over into REM.
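A toy numerical check of the MaxProb idea described above (a sketch with assumed numbers, not the paper's construction): among all types (empirical distributions) of n i.i.d. draws from a source q, the most probable type is the one closest to q in relative entropy, which is the combinatorial root of the MaxProb/REM link.

```python
import numpy as np
from math import factorial

q = np.array([0.5, 0.3, 0.2])   # assumed source distribution
n = 10                          # sample size chosen so that n*q is integral

def type_prob(counts):
    # Exact multinomial probability of observing this type.
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    return coef * np.prod(q ** np.array(counts))

def kl(nu):
    # Relative entropy D(nu || q), with the convention 0 * log 0 = 0.
    nu = np.asarray(nu, float)
    mask = nu > 0
    return float(np.sum(nu[mask] * np.log(nu[mask] / q[mask])))

# Enumerate all types (n1, n2, n3) with n1 + n2 + n3 = n.
types = [(a, b, n - a - b) for a in range(n + 1) for b in range(n + 1 - a)]
most_probable = max(types, key=type_prob)
closest_in_kl = min(types, key=lambda t: kl(np.array(t) / n))
```

Here both criteria single out the type (5, 3, 2), i.e. the empirical distribution equal to q itself; for larger n the probability of a type behaves like exp(-n D(ν‖q)), which is the asymptotic content of the theorem.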


Determination of Maximum Bayesian Entropy Probability Distribution

In this paper, we consider methods for determining maximum-entropy multivariate distributions with a given prior, under constraints prescribing either the marginal distributions alone or the marginals together with the covariance matrix. Next, numerical solutions are considered for cases in which no closed-form solution is available. Finally, the methods are illustrated with numerical examples.


A maximum entropy approach to learn Bayesian networks from incomplete data

This paper addresses the problem of estimating the parameters of a Bayesian network from incomplete data. This is a hard problem, which for computational reasons cannot be effectively tackled by a full Bayesian approach. The workaround is to search for the estimate with maximum posterior probability. This is usually done by selecting the highest posterior probability estimate among those found ...


Hyperbolic Cosine Log-Logistic Distribution and Estimation of Its Parameters by Using Maximum Likelihood, Bayesian, and Bootstrap Methods

In this paper, a new probability distribution based on the family of hyperbolic cosine distributions is proposed, and its various statistical and reliability characteristics are investigated. The new category of HCF distributions is obtained by combining a baseline F distribution with the hyperbolic cosine function. Based on the baseline log-logistic distribution, we introduce a new di...


Maximum Likelihood with Coarse Data based on Robust Optimisation

This paper deals with the problem of probability estimation in the context of coarse data. Probabilities are estimated using the maximum likelihood principle. Our approach presupposes that underlying each imprecise observation there is a precise one, and that the uncertainty pervading the observation is epistemic rather than representing noise. As a consequence, the likelihood function of the illobs...
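One common way to make likelihood well defined for set-valued observations, sketched here with hypothetical Bernoulli data (not the paper's construction), is the optimistic "maximax" reading: each coarse observation is a set of compatible outcomes and is credited with its best compatible precise outcome.

```python
import numpy as np

# Hypothetical coarse Bernoulli data: each observation is the set of
# outcomes compatible with what was seen; {0, 1} means nothing precise
# was recorded.
coarse_obs = [{1}, {1}, {0}, {0, 1}, {1}]

def coarse_loglik(theta):
    # Optimistic log-likelihood: each set contributes the probability of
    # its most favorable compatible precise outcome.
    ll = 0.0
    for s in coarse_obs:
        compatible = [theta if v == 1 else 1 - theta for v in s]
        ll += np.log(max(compatible))
    return ll

# Maximize over a grid of candidate parameters.
grid = np.linspace(0.01, 0.99, 99)
theta_hat = grid[int(np.argmax([coarse_loglik(t) for t in grid]))]
```

With these toy data the unobserved case is completed as a success, so the optimistic estimate is 4/5 = 0.8 rather than the 3/5 a pessimistic completion would give; the gap between such completions is exactly what an epistemic treatment of coarseness has to address.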





Publication date: 2008